Last updated over 1 year ago.

Nate Hagens defines "existential risk" as a threat with the potential to fundamentally alter the trajectory of human civilization, or even to cause human extinction. From Hagens' perspective, these risks encompass both anthropogenic and natural scenarios, such as climate change, nuclear war, pandemics, and unchecked technological developments like artificial intelligence. He emphasizes the interconnectedness and complexity of these threats, highlighting how overreliance on finite resources and unsustainable practices compounds their potential impact. For Hagens, existential risks are not only about catastrophic events but about the broader implications for the continuity and quality of human life on Earth. By understanding and addressing these risks, he argues, humanity can steer toward a more resilient and flourishing future.

See also: exponential growth, nuclear weapon, population growth, nuclear exchange, mass extinction

Daniel Schmachtenberger: "Bend Not Break Part 1: Energy Blindness" | The Great Simplification #05

Daniel Schmachtenberger: "Sensemaking, Uncertainty, and Purpose" | The Great Simplification #31

Daniel Schmachtenberger: "Bend Not Break Part 5" | The Great Simplification #50

Herman Daly: "Toward an Ecological Economics" | The Great Simplification #06

Tomas Björkman: "Metamodernism and The Future" | The Great Simplification #48

FAQs from Episodes 1-25 of The Great Simplification | Frankly #05

Martin Scheringer: "The Growing Threat from Chemical Pollution"

DJ White: "Ocean Effectivism" | The Great Simplification #51

Joan Diamond: "From Kool-aid to Lemonade" | The Great Simplification #28